We review the stationarity of several time series models.
1 MA(q)
The MA(q) model is $X_t = \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q}$. This is always stationary.
Its ACF cuts off after lag $q$: $\rho(h) = 0$ when $|h| > q$.
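As a numerical check of the cutoff, here is a short sketch (NumPy only; the helper name `ma_acf` and the example coefficients are our own choices) that computes the theoretical ACF from $\gamma(h) = \sigma^2 \sum_j \theta_j \theta_{j+h}$:

```python
import numpy as np

def ma_acf(theta, max_lag):
    """Theoretical ACF of X_t = e_t + theta_1 e_{t-1} + ... + theta_q e_{t-q}.

    Uses gamma(h) = sigma^2 * sum_j theta_j theta_{j+h} (theta_0 = 1);
    sigma^2 cancels in the ratio rho(h) = gamma(h) / gamma(0).
    """
    psi = np.concatenate(([1.0], np.asarray(theta, dtype=float)))
    q = len(psi) - 1
    gamma = np.zeros(max_lag + 1)
    for h in range(max_lag + 1):
        if h <= q:
            gamma[h] = np.sum(psi[: q + 1 - h] * psi[h:])
    return gamma / gamma[0]

# MA(2) with theta = (0.5, -0.3): rho(h) = 0 for every h > 2
rho = ma_acf([0.5, -0.3], max_lag=5)
```

The exact zeros beyond lag $q$ are what make the sample ACF a useful diagnostic for choosing $q$.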
2 AR(p)
The AR(p) model is $X_t = \phi_1 X_{t-1} + \cdots + \phi_p X_{t-p} + \varepsilon_t$. (2.1) Now we discuss its stationarity.
2.1 p=1
Now $X_t = \phi X_{t-1} + \varepsilon_t$.
$|\phi| < 1$: $X_t = \sum_{j=0}^{\infty} \phi^j \varepsilon_{t-j}$. (2.2) This is well-defined because $\sum_{j=0}^{\infty} |\phi|^j < \infty$. Here $X_t$ is independent of the future noise $\{\varepsilon_{t+1}, \varepsilon_{t+2}, \ldots\}$. We refer to this as the causal stationary solution.
$|\phi| > 1$: $X_t = -\sum_{j=1}^{\infty} \phi^{-j} \varepsilon_{t+j}$. (2.3) Now $X_t$ is not independent of the future noise, but it is independent of $\{\varepsilon_t, \varepsilon_{t-1}, \ldots\}$. We refer to this as a non-causal stationary solution.
$|\phi| = 1$: take $\phi = 1$, so $X_t = X_{t-1} + \varepsilon_t$ and the differenced series $X_t - X_{t-1} = \varepsilon_t$ is (Gaussian) white noise; this is the Random Walk model. There is no stationary solution.
Now we use backshift notation to derive (2.2) and (2.3). Define $B X_t = X_{t-1}$. Then $X_t = \phi X_{t-1} + \varepsilon_t$ becomes $\phi(B) X_t = \varepsilon_t$, where $\phi(B) = 1 - \phi B$. So
$X_t = \frac{1}{1 - \phi B}\, \varepsilon_t$, (2.4)
and note that $\frac{1}{1 - \phi B} = \sum_{j=0}^{\infty} \phi^j B^j$, so $X_t = \sum_{j=0}^{\infty} \phi^j \varepsilon_{t-j}$. This expansion only makes sense when $|\phi| < 1$, so it gives us (2.2).
And when $|\phi| > 1$, note that
$\frac{1}{1 - \phi B} = \frac{-(\phi B)^{-1}}{1 - (\phi B)^{-1}} = -\sum_{j=1}^{\infty} \phi^{-j} B^{-j}.$
So $X_t = -\sum_{j=1}^{\infty} \phi^{-j} \varepsilon_{t+j}$, which is (2.3).
To recap:
$|\phi| < 1$: use the first expansion, giving (2.2).
$|\phi| > 1$: use the second expansion, giving (2.3).
$|\phi| = 1$: (2.4) doesn't make sense, since neither expansion converges.
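To see (2.2) in action, the following sketch (NumPy; the value of $\phi$, the seed, and the truncation length are arbitrary choices of ours) simulates a causal AR(1) by the recursion and checks that the truncated series expansion reproduces it:

```python
import numpy as np

# Simulate a causal AR(1) (phi = 0.6, standard normal noise) by the
# recursion X_t = phi * X_{t-1} + eps_t, starting from X_0 = 0.
rng = np.random.default_rng(0)
phi, n = 0.6, 400
eps = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

# Formula (2.2), truncated after J terms: X_t ~= sum_{j=0}^{J} phi^j eps_{t-j}.
# The neglected tail is O(phi^J), so the truncation error is tiny.
J, t0 = 60, n - 1
x_ma = sum(phi**j * eps[t0 - j] for j in range(J + 1))
err = abs(x[t0] - x_ma)
```

The geometric decay of the weights $\phi^j$ is exactly why the sum in (2.2) converges when $|\phi| < 1$.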
2.2 p≥1
Still use backshift notation: $\phi(B) X_t = \varepsilon_t$, where now $\phi(B) = 1 - \phi_1 B - \cdots - \phi_p B^p$. To use (2.4), we need to factorize the polynomial so that
$\phi(B) = \prod_{j=1}^{p} (1 - \lambda_j B),$
where $\lambda_1^{-1}, \ldots, \lambda_p^{-1}$ are the roots of $\phi(z)$.
We then get
$X_t = \prod_{j=1}^{p} \frac{1}{1 - \lambda_j B}\, \varepsilon_t.$
If every $|\lambda_j| < 1$, $j = 1, \ldots, p$, then
$X_t = \prod_{j=1}^{p} \Big( \sum_{k=0}^{\infty} \lambda_j^k B^k \Big) \varepsilon_t = \sum_{k=0}^{\infty} \psi_k \varepsilon_{t-k}.$
This is a causal stationary solution. By collecting the coefficients of $B^k$ for each $k$,
$\psi_k = \sum_{k_1 + \cdots + k_p = k} \lambda_1^{k_1} \cdots \lambda_p^{k_p}.$
If every $|\lambda_j| > 1$, $j = 1, \ldots, p$, similarly each factor expands as $\frac{1}{1 - \lambda_j B} = -\sum_{k=1}^{\infty} \lambda_j^{-k} B^{-k}$, yielding a non-causal stationary solution in terms of future noise.
If $|\lambda_j| = 1$ for some $j$, there is no stationary solution.
Summary:
If $|\lambda_j| \ne 1$ for every $j$, there exists a unique stationary solution to (2.1).
If $|\lambda_j| < 1$ for every $j$, the solution is causal.
If $|\lambda_j| > 1$ for some $j$ and $|\lambda_j| < 1$ for the other $j$, the solution is non-causal.
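The summary above can be turned into a small classifier. Note that $|\lambda_j| < 1$ for all $j$ is equivalent to every root $z_j = \lambda_j^{-1}$ of $\phi(z)$ lying strictly outside the unit circle. A sketch (NumPy; the function name `classify_ar` and the examples are our own):

```python
import numpy as np

def classify_ar(phis):
    """Classify X_t = phi_1 X_{t-1} + ... + phi_p X_{t-p} + eps_t via the
    roots of phi(z) = 1 - phi_1 z - ... - phi_p z^p (a root z_j corresponds
    to lambda_j = 1 / z_j, so |z_j| > 1 for all j means causal)."""
    ascending = np.concatenate(([1.0], -np.asarray(phis, dtype=float)))
    roots = np.roots(ascending[::-1])  # np.roots wants highest power first
    mod = np.abs(roots)
    if np.any(np.isclose(mod, 1.0)):
        return "no stationary solution"
    if np.all(mod > 1.0):
        return "causal stationary"
    return "non-causal stationary"

c1 = classify_ar([0.5])        # phi(z) = 1 - 0.5 z, root z = 2
c2 = classify_ar([1.0])        # unit root: the random walk
c3 = classify_ar([2.5, -1.0])  # roots z = 0.5 and z = 2
```

In practice one tests for the unit-root case statistically rather than with an exact equality, but the classification logic is the same.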
2.3 The Box-Jenkins Modeling Philosophy
Only work with causal stationary AR models (and, later, ARMA models).
If the data do not fit this, difference them until a causal stationary ARMA model fits.
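A minimal illustration of the differencing step (a sketch assuming Gaussian noise; seed and length are arbitrary): a random walk is not stationary, but applying $(1 - B)$ once recovers the white noise exactly.

```python
import numpy as np

# A random walk X_t = X_{t-1} + eps_t has no stationary solution, but one
# round of differencing, (1 - B) X_t = eps_t, recovers the white noise.
rng = np.random.default_rng(1)
eps = rng.standard_normal(300)
x = np.cumsum(eps)   # the random walk (with X_0 = eps_0 here)
dx = np.diff(x)      # (1 - B) X_t for t = 1, ..., 299
recovered = np.allclose(dx, eps[1:])
```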
3 ARMA(p, q) Models
Combine AR and MA models:
$X_t = \phi_1 X_{t-1} + \cdots + \phi_p X_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q}.$
When $q = 0$, this is AR(p); when $p = 0$, this is MA(q).
As usual $\varepsilon_t \sim \mathrm{WN}(0, \sigma^2)$. The parameters here are $\phi_1, \ldots, \phi_p, \theta_1, \ldots, \theta_q$, and $\sigma^2$. In backshift notation: $\phi(B) X_t = \theta(B) \varepsilon_t$, where $\phi(B) = 1 - \phi_1 B - \cdots - \phi_p B^p$ and $\theta(B) = 1 + \theta_1 B + \cdots + \theta_q B^q$.
It can be shown that if $\phi(z)$ has all roots with modulus strictly larger than $1$, then the ARMA(p, q) equation has a stationary causal solution:
$X_t = \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j}.$
Denote $\psi(B) = \sum_{j=0}^{\infty} \psi_j B^j$. We can get the coefficients from $\phi(B) \psi(B) = \theta(B)$, i.e.
$(1 - \phi_1 B - \cdots - \phi_p B^p)(\psi_0 + \psi_1 B + \psi_2 B^2 + \cdots) = 1 + \theta_1 B + \cdots + \theta_q B^q,$
and then comparing the coefficients of $B^j$ on both sides: $\psi_0 = 1$ and $\psi_j = \theta_j + \sum_{k=1}^{\min(j, p)} \phi_k \psi_{j-k}$, with $\theta_j = 0$ for $j > q$. ARMA(p, q) generalizes both AR and MA models: when $p = q = 0$, we obtain the white noise model; when $q = 0$, we get AR(p); when $p = 0$, we get MA(q).
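The coefficient comparison gives a simple recursion, sketched below (NumPy; the function name `arma_psi` is our own). For an ARMA(1, 1) the known closed form $\psi_j = (\phi + \theta)\,\phi^{j-1}$, $j \ge 1$, provides a check:

```python
import numpy as np

def arma_psi(phis, thetas, n_terms):
    """MA(infinity) weights from phi(B) psi(B) = theta(B): comparing the
    coefficients of B^j gives psi_0 = 1 and
    psi_j = theta_j + sum_{k=1}^{min(j, p)} phi_k psi_{j-k}  (theta_j = 0, j > q).
    """
    psi = np.zeros(n_terms)
    psi[0] = 1.0
    for j in range(1, n_terms):
        theta_j = thetas[j - 1] if j - 1 < len(thetas) else 0.0
        psi[j] = theta_j + sum(
            phis[k - 1] * psi[j - k] for k in range(1, min(j, len(phis)) + 1)
        )
    return psi

# ARMA(1, 1) with phi = 0.5, theta = 0.4: psi_j = (phi + theta) phi^(j-1), j >= 1
psi = arma_psi([0.5], [0.4], 6)
```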
For ARMA, the ACF and PACF are more complicated: when both $p \ge 1$ and $q \ge 1$, neither the ACF nor the PACF cuts off after a fixed lag.
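To illustrate for the ACF, the following sketch (NumPy; the ARMA(1, 1) parameters and truncation length are our own choices) computes the theoretical ACF from a truncated MA($\infty$) representation, $\gamma(h) = \sigma^2 \sum_j \psi_j \psi_{j+h}$: the ACF decays geometrically with ratio $\phi$ from lag 1 on, but never reaches exactly zero, unlike the MA(q) cutoff.

```python
import numpy as np

# ARMA(1, 1) with phi = 0.5, theta = 0.4: psi_0 = 1 and
# psi_j = (phi + theta) phi^(j-1) for j >= 1. Truncating the weights at J
# terms makes gamma(h) = sum_j psi_j psi_{j+h} computable to high accuracy.
phi, theta, J = 0.5, 0.4, 200
psi = np.concatenate(([1.0], (phi + theta) * phi ** np.arange(J)))
gamma = np.array([np.sum(psi[: len(psi) - h] * psi[h:]) for h in range(12)])
rho = gamma / gamma[0]
# rho is nonzero at every lag, with rho[h+1] / rho[h] = phi for h >= 1
```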